Sequential Inference for Deep Gaussian Process
Authors
Abstract
A deep Gaussian process (DGP) is a deep network in which each layer is modelled with a Gaussian process (GP). It is a flexible model that can capture highly nonlinear functions for complex data sets. However, the network structure of a DGP often makes inference computationally expensive. In this paper, we propose an efficient sequential inference framework for DGPs, in which the data is processed sequentially. We also propose two DGP extensions to handle heteroscedasticity and multi-task learning. Our experimental evaluation shows the effectiveness of our sequential inference framework on a number of important learning tasks.
Similar resources
Supplementary Material: Sequential Inference for Deep Gaussian Process
In this section we briefly review sparse online GPs (GPso) [1, 2]. The key idea is to learn GPs recursively by updating the posterior mean and covariance of the training set {(x_i, y_i)}_{i=1}^n in a sequential fashion. This online procedure is coupled with a sparsification mechanism in which a fixed-size subset of the training set (called the active set) is iteratively selected to avoid the unbounded c...
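The recursion above can be illustrated with a minimal sketch: observations arrive one at a time, the GP posterior is recomputed over a bounded active set, and points are discarded to keep that set fixed-size. This is a simplified stand-in for the method of [1, 2], not their actual algorithm; the novelty threshold and the drop-oldest rule are illustrative heuristics, and all names (`SparseOnlineGP`, `rbf_kernel`) are hypothetical.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    """Squared-exponential kernel between the rows of A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq / lengthscale**2)

class SparseOnlineGP:
    """Toy sparse online GP: sequential updates over a fixed-size active set."""

    def __init__(self, max_active=25, noise_var=0.01):
        self.max_active = max_active
        self.noise_var = noise_var
        self.X = np.zeros((0, 1))  # active-set inputs
        self.y = np.zeros(0)       # active-set targets

    def predict(self, Xs):
        """Posterior mean and variance at test inputs Xs."""
        if len(self.y) == 0:
            return np.zeros(len(Xs)), np.ones(len(Xs))
        K = rbf_kernel(self.X, self.X) + self.noise_var * np.eye(len(self.y))
        Ks = rbf_kernel(Xs, self.X)
        mean = Ks @ np.linalg.solve(K, self.y)
        var = 1.0 - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
        return mean, var

    def update(self, x, y):
        """Process one (x, y) pair; keep the active set bounded."""
        x = np.atleast_2d(x)
        _, var = self.predict(x)
        if var[0] < 1e-6:          # nearly redundant input: skip it
            return
        self.X = np.vstack([self.X, x])
        self.y = np.append(self.y, y)
        if len(self.y) > self.max_active:
            # Crude sparsification: drop the oldest point (real methods
            # score candidates by their effect on the posterior).
            self.X, self.y = self.X[1:], self.y[1:]

# Usage: learn sin(x) from a stream of noisy observations.
rng = np.random.default_rng(0)
gp = SparseOnlineGP(max_active=25)
for _ in range(200):
    x = rng.uniform(0, 6, size=1)
    gp.update(x, np.sin(x[0]) + 0.05 * rng.standard_normal())
mean, _ = gp.predict(np.array([[np.pi / 2]]))
```

Because the active set never exceeds `max_active`, each update costs O(m^3) for active-set size m rather than growing with the number of observations seen so far.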
Recurrent Gaussian Processes
We define Recurrent Gaussian Processes (RGP) models, a general family of Bayesian nonparametric models with recurrent GP priors which are able to learn dynamical patterns from sequential data. Similar to Recurrent Neural Networks (RNNs), RGPs can have different formulations for their internal states, distinct inference methods and be extended with deep structures. In such context, we propose a ...
Random Feature Expansions for Deep Gaussian Processes
The composition of multiple Gaussian Processes as a Deep Gaussian Process (DGP) enables a deep probabilistic nonparametric approach to flexibly tackle complex machine learning problems with sound quantification of uncertainty. Existing inference approaches for DGP models have limited scalability and are notoriously cumbersome to construct. In this work we introduce a novel formulation of DGPs b...
Approximate Inference in Deep GPs
In this talk we will review deep Gaussian process models and relate them to neural network models. We will then consider the details of how variational inference may be performed in these models. The approach is centred on “variational compression”, an approach to variational inference that compresses information into an augmented variable space. The aim of the deep Gaussian process framework i...
Accelerating Deep Gaussian Processes Inference with Arc-Cosine Kernels
Deep Gaussian Processes (DGPs) are probabilistic deep models obtained by stacking multiple layers implemented through Gaussian Processes (GPs). Although attractive from a theoretical point of view, learning DGPs poses some significant computational challenges that arguably hinder their application to a wider variety of problems for which Deep Neural Networks (DNNs) are the preferred choice. We ...